A formal language is a set of words—that is, strings of symbols drawn from a common alphabet (a set, usually finite, of symbols, letters, or tokens from which the words of the language may be formed). A formal language is often defined by means of a formal grammar (also called its formation rules). Words that belong to a formal language are sometimes called well-formed words or well-formed formulas.
Formal languages are studied in computer science and linguistics. The field of formal language theory studies the purely syntactical aspects of such languages—that is, their internal structural patterns.
In computer science, formal languages are often used as the basis for defining programming languages and other systems in which the words of the language are associated with particular meanings or semantics. Software for parsing formal languages may be generated by means of a compiler-compiler, often separated into a lexical analyzer and a parser generator. Since formal languages alone do not have semantics, other constructs are needed for the specification of program semantics. In computational complexity theory, decision problems are typically defined as formal languages, and complexity classes are defined as the sets of formal languages that can be parsed by machines with limited computational power. In logic and in the foundations of mathematics, formal languages are used to represent the syntax of formal theories. A logical system can be seen as a formal language with additional constructs, such as proof calculi, which define a consequence relation.[1] Alfred Tarski's definition of truth, in terms of a T-schema for first-order logic, is an example of a fully interpreted formal language: all its sentences have meanings that make them either true or false.
In less technical contexts, the term artificial language is sometimes used to denote a formal language, although the latter term can also refer to artificially constructed natural languages.
The first formal language is thought to be the one used by Gottlob Frege in his Begriffsschrift (1879), literally meaning "concept writing", which Frege described as a "formal language of pure thought."[2]
An alphabet, in the context of formal languages, can be any set, although it often makes sense to use an alphabet in the usual sense of the word, or more generally a character set such as ASCII. Alphabets can also be infinite; e.g. first-order logic is often expressed using an alphabet which, besides symbols such as ∧, ¬, ∀ and parentheses, contains infinitely many elements x0, x1, x2, … that play the role of variables. The elements of an alphabet are called its letters.
A word over an alphabet can be any finite sequence, or string, of letters. The set of all words over an alphabet Σ is usually denoted by Σ* (using the Kleene star). For any alphabet there is only one word of length 0, the empty word, which is often denoted by e, ε or λ. By concatenation one can combine two words to form a new word, whose length is the sum of the lengths of the original words. The result of concatenating a word with the empty word is the original word.
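As a small illustration of these definitions, the following Python sketch (the function name and variables are this example's own, not any standard API) enumerates the words of Σ* up to a given length and shows that the empty word is the identity for concatenation.

```python
from itertools import product

def words_up_to(alphabet, max_len):
    """Enumerate all words over `alphabet` of length 0 through max_len.
    The length-0 case yields exactly one word: the empty word ''."""
    for n in range(max_len + 1):
        for letters in product(alphabet, repeat=n):
            yield "".join(letters)

sigma = {"a", "b"}
print(sorted(words_up_to(sigma, 2)))
# ['', 'a', 'aa', 'ab', 'b', 'ba', 'bb']

# Concatenation combines two words into one whose length is the sum
# of the original lengths; concatenating with ε returns the original.
w = "ab" + "ba"
assert len(w) == 2 + 2
assert w + "" == w
```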
In some applications, especially in logic, the alphabet is also known as the vocabulary and words are known as formulas or sentences; this breaks the letter/word metaphor and replaces it by a word/sentence metaphor.
A formal language L over an alphabet Σ is a subset of Σ*, that is, a set of words over that alphabet.
In computer science and mathematics, which do not usually deal with natural languages, the adjective "formal" is often omitted as redundant.
While formal language theory usually concerns itself with formal languages that are described by some syntactical rules, the actual definition of the concept "formal language" is only as above: a (possibly infinite) set of finite-length strings, nothing more and nothing less. In practice, there are many languages that can be described by rules, such as regular languages or context-free languages. The notion of a formal grammar may be closer to the intuitive concept of a "language" as something described by syntactic rules. By an abuse of the definition, a particular formal language is often thought of as being equipped with a formal grammar that describes it.
The following rules describe a formal language L over the alphabet Σ = {0, 1, 2, 3, 4, 5, 6, 7, 8, 9, +, =}:

- Every nonempty string that does not contain "+" or "=" and does not start with "0" is in L.
- The string "0" is in L.
- A string containing "=" is in L if and only if there is exactly one "=", and it separates two valid strings of L.
- A string containing "+" but not "=" is in L if and only if every "+" in the string separates two valid strings of L.
- No string is in L other than those implied by the previous rules.
Under these rules, the string "23+4=555" is in L, but the string "=234=+" is not. This formal language expresses natural numbers, well-formed addition statements, and well-formed addition equalities, but it expresses only what they look like (their syntax), not what they mean (semantics). For instance, nowhere in these rules is there any indication that 0 means the number zero, or that + means addition.
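One way to turn these rules into a membership test is sketched below in Python (the helper names are this example's own):

```python
import re

def is_number(s):
    """A number is "0", or a nonempty digit string with no leading zero."""
    return re.fullmatch(r"0|[1-9][0-9]*", s) is not None

def is_sum(s):
    """One or more numbers, with every "+" separating two valid parts."""
    return s != "" and all(is_number(part) for part in s.split("+"))

def in_L(s):
    """A string containing "=" is in L iff exactly one "=" separates two
    valid sums; otherwise the string itself must be a valid sum."""
    if "=" in s:
        left, _, right = s.partition("=")
        return "=" not in right and is_sum(left) and is_sum(right)
    return is_sum(s)

assert in_L("23+4=555")
assert not in_L("=234=+")
```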
For finite languages one can simply enumerate all well-formed words. For example, we can describe a language L as just L = {"a", "b", "ab", "cba"}.
However, even over a finite (non-empty) alphabet such as Σ = {a, b} there are infinitely many words: "a", "abb", "ababba", "aaababbbbaab", …. Therefore formal languages are typically infinite, and describing an infinite formal language is not as simple as writing L = {"a", "b", "ab", "cba"}. Here are some examples of formal languages:

- the set of all words over the alphabet {a, b};
- the set {aⁿ : n is a prime number}, that is, the letter "a" repeated a prime number of times;
- the set of syntactically correct programs in a given programming language;
- the set of inputs upon which a certain Turing machine halts.
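The second example is typical of a language defined by a property rather than by enumeration. A membership-test sketch in Python (the helper names are invented here) could look like:

```python
def is_prime(n):
    # trial division; sufficient for a small illustration
    return n >= 2 and all(n % d for d in range(2, int(n ** 0.5) + 1))

def in_prime_language(word):
    """Membership test for { a^n : n is prime } over the alphabet {a}:
    the word must use only the letter 'a' and have prime length."""
    return set(word) <= {"a"} and is_prime(len(word))

assert in_prime_language("aa") and in_prime_language("aaa")
assert not in_prime_language("aaaa") and not in_prime_language("ab")
```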
Formal language theory rarely concerns itself with particular languages (except as examples), but is mainly concerned with the study of various types of formalisms to describe languages. For instance, a language can be given as:

- those strings generated by some formal grammar;
- those strings described or matched by a particular regular expression;
- those strings accepted by some automaton, such as a Turing machine or finite-state automaton;
- those strings for which some decision procedure produces the answer "yes".
Typical questions asked about such formalisms include:

- What is their expressive power? (Can formalism X describe every language that formalism Y can describe? Can it describe other languages?)
- What is their recognizability? (How difficult is it to decide whether a given word belongs to a language described by formalism X?)
- What is their comparability? (How difficult is it to decide whether two languages, one described in formalism X and one in formalism Y, are actually the same language?)
Surprisingly often, the answer to these decision problems is "it cannot be done at all", or "it is extremely expensive" (with a characterization of how expensive). Therefore, formal language theory is a major application area of computability theory and complexity theory. Formal languages may be classified in the Chomsky hierarchy based on the expressive power of their generative grammar as well as the complexity of their recognizing automaton. Context-free grammars and regular grammars provide a good compromise between expressivity and ease of parsing, and are widely used in practical applications.
Certain operations on languages are common. These include the standard set operations, such as union, intersection, and complement. Another class of operations is the element-wise application of string operations.

Examples: suppose L1 and L2 are languages over some common alphabet Σ.

- The concatenation L1L2 consists of all strings of the form vw where v is a string from L1 and w is a string from L2.
- The intersection L1 ∩ L2 consists of all strings that are contained in both languages.
- The complement ¬L1 consists of all strings over Σ that are not in L1.
- The Kleene star L1* consists of all strings that can be written as a concatenation of zero or more strings from L1.
- The reversal L1^R consists of the reversals of all strings in L1.
- A (string) homomorphism h replaces each letter by a fixed string; its image h(L1) consists of the images of all strings in L1.
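As a concrete illustration, the following Python sketch (functions invented for this example) applies two of these operations to small finite languages represented as sets of strings; since L* is infinite whenever L contains a nonempty word, only a finite portion of it can be materialized.

```python
def concat(L1, L2):
    """Concatenation: every string vw with v in L1 and w in L2."""
    return {v + w for v in L1 for w in L2}

def kleene_up_to(L, k):
    """The strings of L* obtainable by concatenating at most k words."""
    result = {""}                    # ε is always in L*
    layer = {""}
    for _ in range(k):
        layer = {v + w for v in layer for w in L}
        result |= layer
    return result

L1, L2 = {"a", "ab"}, {"b", ""}
print(concat(L1, L2))            # {'a', 'ab', 'abb'}  ('a'+'b' == 'ab'+'')
print(kleene_up_to({"ab"}, 2))   # {'', 'ab', 'abab'}
```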
Such string operations are used to investigate closure properties of classes of languages. A class of languages is closed under a particular operation when the operation, applied to languages in the class, always produces a language in the same class again. For instance, the context-free languages are known to be closed under union, concatenation, and intersection with regular languages, but not closed under intersection or complement. The theory of trios and abstract families of languages studies the most common closure properties of language families in their own right.[3]
The following table summarizes the closure properties of several well-known families of languages: regular, deterministic context-free (DCFL), context-free (CFL), indexed (IND), context-sensitive (CSL), recursive, and recursively enumerable (RE).

Operation | Regular | DCFL | CFL | IND | CSL | Recursive | RE
---|---|---|---|---|---|---|---
Union | Yes | No | Yes | Yes | Yes | Yes | Yes
Intersection | Yes | No | No | No | Yes | Yes | Yes
Complement | Yes | Yes | No | No | Yes | Yes | No
Concatenation | Yes | No | Yes | Yes | Yes | Yes | Yes
Kleene star | Yes | No | Yes | Yes | Yes | Yes | Yes
Homomorphism | Yes | No | Yes | Yes | No | No | Yes
ε-free homomorphism | Yes | No | Yes | Yes | Yes | Yes | Yes
Substitution | Yes | No | Yes | Yes | Yes | No | Yes
Inverse homomorphism | Yes | Yes | Yes | Yes | Yes | Yes | Yes
Reverse | Yes | No | Yes | Yes | Yes | Yes | Yes
Intersection with a regular language | Yes | Yes | Yes | Yes | Yes | Yes | Yes
A compiler usually has two distinct components. A lexical analyzer, generated by a tool like lex, identifies the tokens of the programming language grammar, e.g. identifiers or keywords, which are themselves expressed in a simpler formal language, usually by means of regular expressions. At the most basic conceptual level, a parser, usually generated by a parser generator like yacc, attempts to decide if the source program is valid, that is, whether it belongs to the programming language for which the compiler was built. Of course, compilers do more than just parse the source code—they usually translate it into some executable format. Because of this, a parser usually outputs more than a yes/no answer: typically an abstract syntax tree, which is used by subsequent stages of the compiler to eventually generate an executable containing machine code that runs directly on the hardware, or some intermediate code that requires a virtual machine to execute.
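As a toy illustration of this two-stage pipeline (the token names and grammar below are invented for this example; real compilers use generated analyzers), the following Python sketch tokenizes and parses the addition language from the earlier example, producing a small abstract syntax tree rather than a bare yes/no answer.

```python
import re

TOKEN_RE = re.compile(r"0|[1-9][0-9]*|\+|=")

def tokenize(source):
    """Lexical analysis: split the input into NUMBER, PLUS, and EQUALS
    tokens, each described by a regular expression."""
    tokens, pos = [], 0
    while pos < len(source):
        m = TOKEN_RE.match(source, pos)
        if not m:
            raise SyntaxError(f"unexpected character at position {pos}")
        text = m.group()
        kind = "NUMBER" if text[0].isdigit() else ("PLUS" if text == "+" else "EQUALS")
        tokens.append((kind, text))
        pos = m.end()
    return tokens

def parse(tokens):
    """Parsing: check the token sequence against the grammar
      equation -> sum ('=' sum)? ;  sum -> NUMBER ('+' NUMBER)*
    and build an abstract syntax tree as nested tuples."""
    def parse_sum(i):
        if i >= len(tokens) or tokens[i][0] != "NUMBER":
            raise SyntaxError("expected a number")
        node, i = ("num", tokens[i][1]), i + 1
        while i < len(tokens) and tokens[i][0] == "PLUS":
            if i + 1 >= len(tokens) or tokens[i + 1][0] != "NUMBER":
                raise SyntaxError("expected a number after '+'")
            node, i = ("add", node, ("num", tokens[i + 1][1])), i + 2
        return node, i

    tree, i = parse_sum(0)
    if i < len(tokens) and tokens[i][0] == "EQUALS":
        right, i = parse_sum(i + 1)
        tree = ("eq", tree, right)
    if i != len(tokens):
        raise SyntaxError("trailing input")
    return tree

print(parse(tokenize("23+4=555")))
# ('eq', ('add', ('num', '23'), ('num', '4')), ('num', '555'))
```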
In mathematical logic, a formal theory is a set of sentences expressed in a formal language.
A formal system (also called a logical calculus, or a logical system) consists of a formal language together with a deductive apparatus (also called a deductive system). The deductive apparatus may consist of a set of transformation rules, which may be interpreted as valid rules of inference, or a set of axioms, or have both. A formal system is used to derive one expression from one or more other expressions. Although a formal language can be identified with its formulas, a formal system cannot be likewise identified with its theorems. Two formal systems FS and FS′ may have all the same theorems and yet differ in some significant proof-theoretic way (a formula A may be a syntactic consequence of a formula B in one but not in the other, for instance).
A formal proof or derivation is a finite sequence of well-formed formulas (which may be interpreted as propositions) each of which is an axiom or follows from the preceding formulas in the sequence by a rule of inference. The last sentence in the sequence is a theorem of a formal system. Formal proofs are useful because their theorems can be interpreted as true propositions.
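To make this concrete, here is a minimal sketch of a derivation checker in Python; the tuple encoding of formulas, the treatment of premises as axioms, and the restriction to modus ponens as the only rule of inference are simplifying assumptions of this example.

```python
def follows_by_mp(formula, earlier):
    """Modus ponens: from A and ('->', A, B), conclude B.
    Implications are encoded as ('->', antecedent, consequent) tuples."""
    return any(("->", a, formula) in earlier for a in earlier)

def is_proof(steps, axioms):
    """A derivation is valid if every formula in the sequence is an
    axiom or follows from the preceding formulas by the inference rule."""
    earlier = set()
    for f in steps:
        if f not in axioms and not follows_by_mp(f, earlier):
            return False
        earlier.add(f)
    return True

# Treating the premises p and p -> q as axioms, the sequence below is a
# proof whose last formula, the theorem q, follows by modus ponens.
axioms = {"p", ("->", "p", "q")}
assert is_proof(["p", ("->", "p", "q"), "q"], axioms)
```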
Formal languages are entirely syntactic in nature but may be given semantics that give meaning to the elements of the language. For instance, in mathematical logic, the set of possible formulas of a particular logic is a formal language, and an interpretation assigns a meaning to each of the formulas—usually, a truth value.
The study of interpretations of formal languages is called formal semantics. In mathematical logic, this is often done in terms of model theory. In model theory, the terms that occur in a formula are interpreted as mathematical structures, and fixed compositional interpretation rules determine how the truth value of the formula can be derived from the interpretation of its terms; a model for a formula is an interpretation of terms such that the formula becomes true.
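For propositional logic, an interpretation is simply an assignment of truth values to the propositional variables. The following Python sketch (the tuple encoding of formulas is this example's own convention) evaluates a formula compositionally under such an interpretation; an interpretation that makes the formula true is a model of it.

```python
def evaluate(formula, interpretation):
    """Compositionally derive the truth value of a formula from the
    interpretation of its variables."""
    if isinstance(formula, str):               # a propositional variable
        return interpretation[formula]
    op, *args = formula
    if op == "not":
        return not evaluate(args[0], interpretation)
    if op == "and":
        return all(evaluate(a, interpretation) for a in args)
    if op == "or":
        return any(evaluate(a, interpretation) for a in args)
    raise ValueError(f"unknown connective: {op}")

formula = ("or", "p", ("not", "q"))
assert evaluate(formula, {"p": False, "q": False})       # a model of the formula
assert not evaluate(formula, {"p": False, "q": True})    # not a model
```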